Push-SAGA: A Decentralized Stochastic Algorithm With Variance Reduction Over Directed Graphs
Authors
Muhammad I. Qureshi, Ran Xin, Soummya Kar, Usman A. Khan
Abstract
In this letter, we propose Push-SAGA, a decentralized stochastic first-order method for finite-sum minimization over a directed network of nodes. Push-SAGA combines node-level variance reduction to remove the uncertainty caused by stochastic gradients, network-level gradient tracking to address the distributed nature of the data, and push-sum consensus to tackle information exchange over the directed graph. We show that Push-SAGA achieves linear convergence to the exact solution for smooth and strongly convex problems and is thus the first linearly-convergent stochastic algorithm over arbitrary connected directed graphs. We also characterize the regime in which Push-SAGA achieves a speed-up compared with its centralized counterpart and a network-independent convergence rate. We illustrate the behavior and convergence properties of Push-SAGA with the help of numerical experiments on non-convex problems.
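The letter itself gives the precise recursion; purely as an illustration of how the three ingredients interact, here is a minimal Python sketch that runs a SAGA gradient table, a gradient tracker, and push-sum de-biasing on a synthetic least-squares problem over a directed ring. The data, topology, weight matrix, step size, and iteration count are all assumptions made for the sketch, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic problem (an assumption): n nodes, each holding m local
# least-squares samples; the global objective averages over all nodes.
n, m, d = 5, 20, 3
A = rng.normal(size=(n, m, d))
b = rng.normal(size=(n, m))

def grad(i, s, x):
    """Gradient of node i's s-th component f_{i,s}(x) = 0.5*(A[i,s]@x - b[i,s])**2."""
    a = A[i, s]
    return a * (a @ x - b[i, s])

# Column-stochastic mixing matrix for a directed ring with self-loops,
# as push-sum requires (columns sum to one; no doubly-stochastic assumption).
B = np.zeros((n, n))
for i in range(n):
    B[i, i] = 0.5
    B[(i + 1) % n, i] = 0.5

alpha = 0.02                       # step size, hand-tuned for this sketch
z = np.zeros((n, d))               # push-sum numerators
y = np.ones(n)                     # push-sum weights
x = z / y[:, None]                 # de-biased node estimates
table = np.zeros((n, m, d))        # SAGA gradient table: one slot per local sample
avg = table.mean(axis=1)           # per-node average of the table

g = np.array([grad(i, rng.integers(m), x[i]) for i in range(n)])
w = g.copy()                       # gradient tracker, initialized at the first estimate

for _ in range(5000):
    z = B @ (z - alpha * w)        # descent along the tracker, then push-sum mixing
    y = B @ y
    x = z / y[:, None]
    g_new = np.empty_like(g)
    for i in range(n):
        s = rng.integers(m)                       # sample one local component
        gi = grad(i, s, x[i])
        g_new[i] = gi - table[i, s] + avg[i]      # SAGA variance-reduced estimate
        avg[i] += (gi - table[i, s]) / m          # keep the table average current
        table[i, s] = gi
    w = B @ w + g_new - g          # gradient tracking of the global gradient estimate
    g = g_new

# Every node should approach the minimizer of the pooled least-squares problem.
x_star = np.linalg.lstsq(A.reshape(-1, d), b.reshape(-1), rcond=None)[0]
print("max node error:", np.max(np.abs(x - x_star)))
```

Because the mixing matrix is only column-stochastic, the raw states z drift toward a skewed average; dividing by the push-sum weights y is what recovers an unbiased network average on a directed graph.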
Similar Resources
Stochastic Conjugate Gradient Algorithm with Variance Reduction
Conjugate gradient methods are an important class of methods for solving linear equations and nonlinear optimization problems. In our work, we propose a new stochastic conjugate gradient algorithm with variance reduction (CGVR) and prove its linear convergence with the Fletcher and Reeves method for strongly convex and smooth functions. We experimentally demonstrate that the CGVR algorithm converges fast...
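As a rough illustration of the idea in this abstract, the following Python sketch combines SVRG-style variance reduction with Fletcher-Reeves conjugate directions on a hypothetical ridge-regression problem. The objective, step size, and restart-per-epoch choice are assumptions for the sketch, not the authors' exact CGVR method.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical ridge-regression objective (not from the paper):
# f(x) = (1/N) * sum_s 0.5*(a_s @ x - b_s)**2 + 0.5*lam*||x||^2
N, d, lam = 200, 10, 0.1
A = rng.normal(size=(N, d))
b = rng.normal(size=N)

def sgrad(s, x):
    return A[s] * (A[s] @ x - b[s]) + lam * x

def full_grad(x):
    return A.T @ (A @ x - b) / N + lam * x

x, eta = np.zeros(d), 0.02                    # hand-tuned step size
for epoch in range(30):
    snap, mu = x.copy(), full_grad(x)         # SVRG-style snapshot and full gradient
    g_prev, p = None, None                    # restart conjugate directions each epoch
    for _ in range(N):
        s = rng.integers(N)
        g = sgrad(s, x) - sgrad(s, snap) + mu # variance-reduced stochastic gradient
        if p is None:
            p = -g
        else:
            beta = (g @ g) / (g_prev @ g_prev)  # Fletcher-Reeves coefficient
            p = -g + beta * p
        g_prev = g
        x = x + eta * p

print("final gradient norm:", np.linalg.norm(full_grad(x)))
```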
Zeroth-order Asynchronous Doubly Stochastic Algorithm with Variance Reduction
Zeroth-order (derivative-free) optimization attracts a lot of attention in machine learning because explicit gradient calculations may be computationally expensive or infeasible. To handle problems that are large in both volume and dimension, asynchronous doubly stochastic zeroth-order algorithms were recently proposed. The convergence rate of existing asynchronous doubly stochastic zeroth-order ...
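The core derivative-free ingredient is easy to sketch. The minimal Python example below uses a two-point zeroth-order gradient estimate to drive plain SGD on a hypothetical quadratic; the asynchronous, doubly stochastic machinery of the cited work is omitted.

```python
import numpy as np

rng = np.random.default_rng(2)

def f(x):
    # Hypothetical smooth test function; any black-box objective would do.
    return 0.5 * np.sum((x - 1.0) ** 2)

def zo_grad(f, x, mu=1e-4):
    """Two-point (central-difference) zeroth-order gradient estimate.

    With u uniform on the unit sphere, E[d * (u @ grad) * u] = grad, so the
    estimate is unbiased for quadratics and nearly unbiased for small mu.
    """
    u = rng.normal(size=x.shape)
    u /= np.linalg.norm(u)
    return x.size * (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u

x = np.zeros(8)
for _ in range(5000):
    x -= 0.05 * zo_grad(f, x)   # plain SGD driven by the derivative-free estimate
print("distance to optimum:", np.linalg.norm(x - 1.0))
```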
Accelerated Stochastic ADMM with Variance Reduction
Alternating Direction Method of Multipliers (ADMM) is a popular method for solving machine learning problems. Stochastic ADMM was first proposed to reduce the per-iteration computational complexity, making it more suitable for big-data problems. Recently, variance reduction techniques have been integrated with stochastic ADMM to obtain a fast convergence rate, such as SAG-ADMM an...
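To make the combination concrete, here is a minimal Python sketch of a linearized stochastic ADMM step whose data gradient is replaced by an SVRG-style variance-reduced estimate, applied to a hypothetical lasso problem. The problem instance, penalty parameter, and step size are assumptions; this is not the specific SAG-ADMM or accelerated variant referenced above.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical lasso problem (not from the paper):
# min_x 0.5/N * ||A x - b||^2 + lam * ||z||_1  subject to  x = z
N, d, lam = 300, 20, 0.05
A = rng.normal(size=(N, d))
x_true = rng.normal(size=d) * (rng.random(d) < 0.3)
b = A @ x_true + 0.01 * rng.normal(size=N)

def sgrad(s, x):                   # gradient of one squared-loss component
    return A[s] * (A[s] @ x - b[s])

def soft(v, t):                    # proximal map of t*||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x, z, u = np.zeros(d), np.zeros(d), np.zeros(d)
rho, eta = 1.0, 0.05
for epoch in range(30):
    snap, mu = x.copy(), A.T @ (A @ x - b) / N    # SVRG-style anchor gradient
    for _ in range(N):
        s = rng.integers(N)
        g = sgrad(s, x) - sgrad(s, snap) + mu     # variance-reduced gradient
        # Linearized x-update: gradient step damped by the augmented-Lagrangian
        # quadratic and a proximal term, available in closed form.
        x = (x / eta + rho * (z - u) - g) / (1.0 / eta + rho)
        z = soft(x + u, lam / rho)                # z-update: soft-thresholding
        u = u + x - z                             # dual update on x = z
print("objective:", 0.5 * np.sum((A @ z - b) ** 2) / N + lam * np.sum(np.abs(z)))
```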
A Proximal Stochastic Gradient Method with Progressive Variance Reduction
We consider the problem of minimizing the sum of two convex functions: one is the average of a large number of smooth component functions, and the other is a general convex function that admits a simple proximal mapping. We assume the whole objective function is strongly convex. Such problems often arise in machine learning as regularized empirical risk minimization. We propose and analy...
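A minimal sketch of such a proximal stochastic gradient scheme with progressive variance reduction (in the spirit of prox-SVRG) is shown below on a hypothetical l1-regularized logistic-regression instance. The data, step size, and inner-loop length are assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical l1-regularized logistic regression (not from the paper):
# min_x (1/N) * sum_s log(1 + exp(-y_s * a_s @ x)) + lam * ||x||_1
N, d, lam = 400, 15, 0.01
A = rng.normal(size=(N, d))
y = np.sign(A @ rng.normal(size=d) + 0.1 * rng.normal(size=N))

def sgrad(s, x):                   # gradient of one logistic-loss component
    return -y[s] * A[s] / (1.0 + np.exp(y[s] * (A[s] @ x)))

def full_grad(x):
    w = -y / (1.0 + np.exp(y * (A @ x)))
    return A.T @ w / N

def prox_l1(v, t):                 # proximal map of t*||.||_1 (soft-thresholding)
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x, eta = np.zeros(d), 0.1
for epoch in range(30):
    snap, mu = x.copy(), full_grad(x)             # full gradient at the snapshot
    for _ in range(2 * N):
        s = rng.integers(N)
        v = sgrad(s, x) - sgrad(s, snap) + mu     # progressively reduced variance
        x = prox_l1(x - eta * v, eta * lam)       # proximal gradient step
print("nonzeros in x:", int(np.count_nonzero(x)))
```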
Stochastic Blockmodels for Directed Graphs
Journal
Journal Title: IEEE Control Systems Letters
Year: 2022
ISSN: 2475-1456
DOI: https://doi.org/10.1109/lcsys.2021.3090652